07 March 2024

The Digital Services Act as a Global Transparency Regime

On both sides of the Atlantic, policymakers are struggling to rein in the power of large online platforms and technology companies. Transparency obligations have emerged as a key policy tool that may support or enable achieving this goal. The core argument of this blog is that the Digital Services Act (DSA) creates, at least in part, a global transparency regime. This has implications for transatlantic dialogues and cooperation on matters of platform governance. Regulators, researchers, and civil society organizations may be able to use the DSA transparency rules to make large platforms and other technology companies more responsive to the public values of the larger societies that they serve.

In the United States (US), several members of Congress have proposed bills, including the Platform Accountability and Transparency Act, the Social Media Data Act, the Digital Services Oversight and Safety Act, and the Kids Online Safety Act (KOSA), that would increase transparency obligations concerning platform content moderation practices, online advertising, and safeguards to protect personal data and children. None of these bills has been enacted, although KOSA is under active consideration.

The main regulatory agency in the US that has engaged in online platform regulation is the Federal Trade Commission (FTC), which has investigatory powers to demand transparency from platforms and other large companies when they may have engaged in unfair or deceptive practices. The President also has authority to issue Executive Orders, which sometimes include rules that require technology developers to be more transparent.

Yet, now that the DSA has come into force, the European Union (EU) has taken a very large step ahead of the US in making the data usage and content moderation practices of platforms more transparent. Among the host of mandatory DSA transparency requirements are the preparation of transparency reports, contributions to a DSA Transparency Database on content moderation decisions, new data access rules for regulators and researchers, the preparation of audit reports, a digital terms and conditions (T&Cs) database, and ad repositories. The DSA is a very ambitious policy initiative aimed at cracking open not just one, but many, black boxes.

Although the geographical focus of the DSA is on EU member states, some of its transparency provisions may contribute to global transparency and observability of platforms. The goal of this blog is to examine to what extent the DSA’s transparency provisions can benefit researchers and regulators outside the European Union.

Categories of DSA Transparency Obligations

The transparency obligations in the DSA can usefully be sorted into four categories: 1) consumer-facing transparency obligations; 2) mandatory reporting and information access obligations to national regulators and the European Commission; 3) rights of access to data; and 4) obligations to contribute to public-facing databases of information.

We first discuss the DSA’s consumer-facing transparency obligations that require platforms to provide certain types of information to their users. Some of these obligations target all users. For example, Article 26 of the DSA obliges online platforms to identify advertising as such and to explain their main targeting criteria and how consumers can change these criteria. In addition, Article 27 obliges platforms to set out in their T&Cs the main parameters used in their recommender systems.

Other DSA transparency rights accrue to individual consumers in particular circumstances. For example, Article 32 requires online platforms to inform individual consumers if a product or service they acquired through a platform was illegal. Additionally, Article 16(5) requires platforms to inform users that their content has been taken down.

In principle, these DSA rules are intended to benefit consumers established or located in the EU, and they certainly apply to non-European consumers located in the EU.

Although these rules are not directly applicable or enforceable outside the EU, they may potentially benefit non-European consumers through the so-called “Brussels effect” insofar as online platforms decide not to limit these extra transparency rights just to EU consumers. There is no language in most of these provisions that would exclude the applicability of these provisions to consumers located outside the EU.

A second category of transparency obligations comprises mandatory reporting and information access obligations to national regulators and the European Commission. Obvious examples are the powers of national Digital Services Coordinators (DSCs) under Article 51 of the DSA to require covered platforms to provide information and explanations upon request. Articles 5 and 67 of the DSA give the Commission investigatory powers as to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). These information and investigation powers are reserved to national European regulatory authorities and the Commission.

The DSA requires covered online services to prepare reports annually about their compliance with the DSA and to maintain data pertinent to those reports. However, they are not required to submit these materials annually to the Commission or to a DSC. The online services must, however, provide their compliance reports to EU regulators upon request so that regulators can analyze the extent to which the services have complied with DSA obligations. These online services bear the burden and expense of preparing annual reports and maintaining data that may never be reviewed by any EU regulator. The services can never know when (if ever) regulators will make such requests, but they must be ready to comply.

Article 37 requires the online services to hire, at their own expense, independent auditors to assess their compliance with DSA obligations. It further requires services to provide auditors with access to all data needed to conduct an audit and identifies the kinds of data that should be part of an audit. We worry about the lack of well-established auditing standards akin to those long established for financial auditing. The DSA does not contemplate that these audits would be available to the Commission or to DSCs, but one can imagine EU regulators demanding access to them if the regulators were dissatisfied with an online service’s annual report once they analyzed a requested copy.

VLOPs and VLOSEs must, in accordance with Article 42 of the DSA, also prepare reports on their mandatory systemic risk assessments and mitigation measures; on audits, audit implementation reports, and consultations; and on their numbers of monthly users. The Commission and the national DSCs of the countries where the platforms are established may require covered platforms and search engines to supply these reports to European authorities.

Regulators from other countries might, however, be interested in gaining access to the annual reports that the DSA requires covered online services to prepare. Article 40 says EU regulators can access the reports only to assess compliance with the DSA. But would the Commission object if the FTC, for example, demanded access to the annual reports of online services operating in the US? We presume that the FTC could issue a civil investigative demand directly to the services, asking for copies of reports prepared for compliance with the DSA.

If the Commission wants to achieve a “Brussels effect” by setting a regulatory standard for other nations to follow, perhaps it would welcome easing the burdens of non-EU regulators in this way.

Systemic risk assessment and monitoring are among the core transparency obligations for VLOPs under the DSA. These requirements respond to growing concerns about the impact of these platforms on the broader information ecosystem and on fundamental rights. This information about systemic risks may potentially be of great interest to regulators outside the EU.

Under Articles 42(4) and 42(5) of the DSA, risk assessment information is to become accessible outside the EU three months after platform reports have been submitted to EU authorities, albeit possibly in redacted form. Under the DSA, providers of VLOPs and VLOSEs can, before the reports become public, remove certain parts that might disclose confidential information, pose security risks, or otherwise harm the firms publishing them.

The utility of these reports for non-EU regulators will, of course, depend on how extensively platforms excise information from these reports before making them public. Covered platforms and search engines should not, however, edit the reports to prevent non-EU authorities from being able to access information the reports contain unless one of the legitimate rationales for excision applies.

A third category of DSA transparency rules consists of those that create a right of access to data necessary to monitor and assess compliance. Article 40’s access to data provision allows EU policymakers to obtain a deeper level of observability, which would help address the growing information asymmetries between platforms and society at large. Professors Rieder and Hofmann have observed that “[t]he expanding data sets on vast numbers of people and transactions bear the potential for privileged insights into societies’ texture, even if platforms tend to use them only for operational purposes.” These authors suggest that an essential pre-condition for public accountability is the “institutionalisation of reliable information interfaces between digital platforms and society – with a broad mandate to focus on the public interest.”

We believe that the access to data provisions in Article 40 of the DSA should be understood to create such an interface. In addition to DSCs and the Commission, “vetted researchers” can request access to data held by VLOPs and VLOSEs to gauge compliance with DSA obligations.

Article 40 of the DSA contemplates that researchers would submit proposals to DSCs identifying the online service providers whose data they want to access, along with a research plan. Coordinators would then “vet” researchers under the criteria set forth in Article 40(8). Once a researcher has been vetted, the coordinator would notify the online services that the vetted researcher should be given access to data for compliance assessment purposes.

The vetting criteria include supplying information about the research organization with which the researcher is affiliated, their independence from commercial interests, sources of funding for their research, the ability to comply with data security and confidentiality rules, and an intent to carry out research for purposes set forth in Article 40(4). To be vetted, researchers must also agree to publish the results of their study without charge within a reasonable time after finishing their research project. This means that the research outputs about DSA compliance will become publicly available to all who may be interested in finding out about how well (or not) platforms did.

Vetted researchers are, however, restricted in the purpose for which they can request access to platform data, for the DSA says vetted researchers can access data only for “the sole purpose of conducting research that contributes to the detection, identification and understanding” of a pre-defined list of systemic risks under Article 34 of the DSA or the assessment of the “adequacy, efficiency and impacts of the risk mitigation measures” that the DSA requires. In other words, research access is only possible to the extent that it contributes to the enforcement of the DSA.

By authorizing DSCs to require online services to grant independent researchers access to data concerning risk assessment and risk mitigation strategies and to publish results of their research, the DSA offloads some burdens that EU regulators might otherwise have to bear to those researchers whom the coordinators vet.

Practically speaking, this strategy raises important questions about the proper role of researchers in enforcement actions, the need to protect academic independence and autonomy, and how to combine the demands of the DSA with the way academic research is conducted, assessed, and funded.

So far as we can tell, the researcher data access rights set forth in Article 40 may be available to researchers outside of the EU. There will almost certainly be US researchers who would want to request access to data under this regime because there are no equivalent data access mandates under US law.

Although the DSA does not define which researchers are eligible for data access rights, it refers to the definition of “research organisation” in Article 2(1) of Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market. That provision requires researchers to be affiliated with a research organisation, such as a university, a research institute, or another entity whose primary goal is to conduct scientific research on a not-for-profit basis or pursuant to a public interest mission recognised by an EU member state. There is no explicit requirement that this be a European university or research entity. Nor does Article 40(8) say that the DSC can deny an application for data access to non-Europeans (in this sense also Dergacheva, Katzenbach, Schwemer & Quintais 2023 and Husovec 2023).

Arguably, it is in the interest of EU policymakers to open up Article 40 of the DSA to non-European researchers. A significant share of the research on platform auditing conducted to date originates from the US. Drawing on the extensive expertise and experience of non-EU researchers for the purposes of assessing compliance with the DSA would be very much in Europe’s interest. (More information on how non-EU researchers might exercise the access right can be found here and here.)

A fourth category of DSA transparency rules is the obligation of platforms to make certain information publicly available in databases and ad repositories. Examples are the ad repositories mandated by Article 39 of the DSA. Providers of VLOPs and VLOSEs are obliged to make available, in a specific section of their online interface, a searchable repository containing information about the content of their online commercial and political advertisements. Also required are disclosures about on whose behalf an advertisement was presented, who paid for it, the groups targeted and the targeting parameters used, and the total number of recipients. (For an insightful discussion of the design requirements of ad archives, see van Drunen & Noroozian 2024.)
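
To make these disclosure items concrete, the following is a minimal sketch, in Python, of what a single repository record could look like. The class and field names are our own hypothetical labels for the Article 39 items, not a format prescribed by the DSA or used by any platform.

```python
# Hypothetical record type mirroring the Article 39 disclosure items.
# Field names are illustrative; each platform designs its own repository.
from dataclasses import dataclass, field

@dataclass
class AdRepositoryRecord:
    ad_content: str              # the content of the advertisement
    presented_on_behalf_of: str  # on whose behalf the ad was presented
    paid_for_by: str             # who paid for the advertisement
    targeting_parameters: dict[str, str] = field(default_factory=dict)  # main targeting parameters used
    groups_targeted: list[str] = field(default_factory=list)            # groups of recipients targeted
    total_recipients: int = 0    # aggregate number of recipients reached

# Example: filtering a downloaded repository for ads paid for by one sponsor.
def ads_paid_by(records: list[AdRepositoryRecord], sponsor: str) -> list[AdRepositoryRecord]:
    return [r for r in records if r.paid_for_by == sponsor]
```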

Moreover, all platforms covered by the DSA, not just the VLOPs and VLOSEs, must publish statements of reasons for their content moderation actions. Platforms must send those statements to the DSA Transparency Database, which is operated by the Commission under Articles 17 and 24(5) of the DSA. These statements must include information about the type of content moderation restriction adopted, as well as the grounds and the surrounding facts and circumstances that influenced the decision.
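
Because the Transparency Database is public, researchers anywhere can query it programmatically. The sketch below illustrates how such a query might look in Python; the endpoint path, parameter names, and response fields are our assumptions for illustration and should be checked against the database’s own API documentation before use.

```python
# A minimal sketch of querying the public DSA Transparency Database.
# The endpoint path, query parameters, and response fields below are
# assumptions for illustration; consult the database's published API docs.
from collections import Counter

import requests

BASE_URL = "https://transparency.dsa.ec.europa.eu/api/v1/statements"  # assumed path

def fetch_statements(platform_name: str, limit: int = 100) -> list[dict]:
    """Fetch recent statements of reasons submitted by one platform (hypothetical parameters)."""
    response = requests.get(
        BASE_URL,
        params={"platform_name": platform_name, "per_page": limit},  # assumed parameter names
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("data", [])

if __name__ == "__main__":
    # Example: tally the restriction types one platform reports most often.
    statements = fetch_statements("ExamplePlatform")
    counts = Counter(s.get("category") for s in statements)  # assumed field name
    print(counts.most_common(5))
```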

Yet another platform transparency resource established by the Commission is the T&Cs Database. Platforms use their T&Cs for registered users as an important instrument of private governance. The goal of the database is to give the public more insight into this element of the legal landscape. Currently, the database includes 790 T&Cs from more than 290 service providers, including Terms of Service and Privacy Policies, as well as developer terms.

All of these information resources created and maintained in the EU will be available to anyone in the world who wants to access them.

Does the DSA Have a Global Reach?

The DSA is an ambitious step towards a global transparency regime. A significant share of the transparency obligations in the DSA are not limited to European regulators, consumers, and researchers. This includes transparency about platforms’ statements of reasons for content moderation actions, information about political and commercial ads, T&Cs, audits, and systemic risk assessments, as well as access to the deeper layers of the algorithmic infrastructure through data access rights available to stakeholders outside the European Union.

The benefits of transparency for EU and non-EU regulators provided by the DSA may be mutual. By extending the scope of potential observers, the EU too can benefit from the expertise and insights from actors outside the Union.

This more inclusive approach to global transparency resonates with a push for more international coordination and participation in (EU-led) platform governance. In the emerging digital regulatory framework, there are various ways in which non-EU stakeholders, including civil society and potentially non-EU regulators, can become involved in and influence EU platform governance.

Under Article 51(3) of the DSA, for example, DSCs can invite “interested parties” and “any other third party demonstrating a legitimate interest” to submit written observations on planned enforcement actions and to participate in the proceedings. There is nothing in the text that would exclude non-European regulators, such as the FTC, or non-European competitors from taking an active part in the enforcement deliberations of national DSCs.

The Digital Markets Act (DMA) likewise entitles “[a]ny third party” to inform the national competent authority of the Member State or the Commission about “any practice or behaviour by gatekeepers that falls within the scope of this Regulation” in the context of an infringement procedure under Article 27 of the DMA. The European Media Freedom Act (EMFA) explicitly foresees the possibility that the Board could coordinate with non-EU regulators under Article 16 EMFA, and it introduces the instrument of so-called ‘structured dialogues’ that are also open to non-EU civil society actors under Article 18.

In a similar way, the draft AI Act explicitly foresees cooperation and coordination with non-European authorities and international organisations under Article 58e of the AI Act. The planned Advisory Forum and Scientific Panel are also open to non-EU stakeholders under Articles 58a and 58b, giving them an influential role in the further implementation and operationalisation of the European approach to AI governance.

Another aspect of the AI Act that is open to non-EU stakeholders concerns international standardisation in the field of AI. According to Art. 40(1)(c) of the AI Act, the actors involved in the standardisation process must “contribute to strengthening global cooperation on standardisation and taking into account existing international standards in the field of AI.” Such cooperation also takes place as part of EU-US initiatives such as the EU-U.S. Trade and Technology Council (TTC).

Has the EU, through the DSA and related initiatives, gone a long way toward achieving a “Next Level Brussels Effect”? On EU regulators’ optimistic view, not only would global platforms adhere to, and export, European standards in their operations outside of the Union, but there would also be a new push toward an EU-led approach to the creation of global observability and governance frameworks through transparency, cooperation, codes of conduct, and coordination on standardisation.

While we recognize the ambition and optimism underlying the promulgation of the DSA and related initiatives, these new regulations are still at an early stage, and the regulatory cultures of the EU, the US, and other nations are distinctly different. Some clashes over the burdens and costs that these new rules impose, and over the rules’ impacts on competition and innovation in information technology industries, seem quite likely. We look forward to seeing how these tensions play out in the coming years.


SUGGESTED CITATION  Helberger, Natali; Samuelson, Pamela: The Digital Services Act as a Global Transparency Regime, VerfBlog, 2024/3/07, https://verfassungsblog.de/the-digital-services-act-as-a-global-transparency-regime/, DOI: 10.59704/06c97b13f47ed11c.
